perm filename LUSER.2[1,JRA] blob sn#484968 filedate 1979-10-26 generic text, type C, neo UTF8

Third LISP Newsletter: Sept 27, 1979 (begun); Oct 13 (finished) -- no! Oct 26

Here's number three; late as usual.  Hopefully, by this time you have seen
the August issue of  BYTE magazine. They are  somewhat more dependable  in
their production schedule than I am.

As I mentioned in the last letter, I expect RESPONSE from you.  I received
only three notes from about 40 mailings. I assume you got the letter.

Several interesting  things have  been  going on  since the  last  letter.
First, I observed some of the "goings on" at the UC Santa Cruz Programming
Methodology Summer  Short Course.   There  was a  strong  anti-interactive
atmosphere; not  just anti-interactive  programming, but  anti-interactive
editing as well. One expected to hear the old anti-time-sharing  arguments
from the early 60's again.  The general effect was most depressing.

Partly as a reaction to this and  partly because it is still a good idea,  I
have resurrected my plan to have a LISP Conference next year.  The  thrust
of the  Conference is  the non-AI  aspects of  LISP-related  developments:
languages, programming, theory,  and applications;  yes, even  programming
methodology, from the LISP point of view.  It will be August 24-27, 1980 at
Stanford.  Here's a general outline of what's up.

Architecture:  projects  range from  re-microcoded commercially  available
hardware to specially designed LISP chips.

Languages and Theory: applicative languages, object-oriented languages and
formal semantics of LISP-like languages

Programming  and  Environments:   LISP's  flexible  programming  behavior,
including the support systems which surround the language.

Applications: non-traditional applications-- mathematics, music, graphics,
and what-have-you.

The conference should be a very  exciting and positive event for  computer
science.

The program committee is:  John R. Allen, Bruce Anderson, Richard Fateman,
Dan Friedman, Eiichi Goto, Patrick Greussay, Tony Hearn, Carl Hewitt, Alan
Kay, Peter Landin, Joachim Laubsch, John McCarthy, Gianfranco Prini,  Erik
Sandewall, Carolyn Talcott, and David Wise.

----- Lots of micro LISPs are creeping  around now (I will soon enter  the
fray, as The LISP  Company, with one  for the Z-80).  There are two  major
problems which must be dealt with: some implementations are "toys", either
in capabilities or in speed or  BOTH. Such implementations are based on  a
simplified view of what LISP is.  For example, "theoretically" LISP is
definable in terms of five simple primitives and conditional expressions,
plus an implementation of the run-time system (some toys don't even supply
all of that) and some simple I/O routines.  Such LISPs do little to
dispel the myth that LISP is an unpleasant collection of parentheses:
unmanageable, dull, and slow.  A production LISP is further from this
toy view than, say, UCSD Pascal is from a trivial arithmetic programming
language which supplies only the addition operation, if-then-else, and
numeric reading and printing functions.  A production LISP is a systems
programmer's tool box, full of data structuring devices, scanners,
parsers, unparsers, symbol table organizers, compilers, editors,
debuggers, ... all INTEGRATED into a uniformly accessible system.  To
advertise LISP as anything less is a disservice.  Economic reality can
creep in, of course; a truly integrated system requires an interactive
display system which is currently beyond reasonable pricing for personal
systems.  However, the concept -- the ideal -- must not be compromised.
Alas, many of the new micro implementations view LISP in this restricted
perspective.
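To make the "five primitives" claim concrete, here is a deliberately tiny
sketch -- in Python rather than LISP, purely so the reader can run it; the
representation of pairs and the helper names are my own choices, not those
of any particular implementation:

```python
# A toy model of the five LISP primitives over dotted pairs.
# Pairs are Python 2-tuples; atoms are strings; None plays NIL.

def cons(a, d):          # build the dotted pair (a . d)
    return (a, d)

def car(p):              # first element of a pair
    return p[0]

def cdr(p):              # rest of a pair
    return p[1]

def atom(x):             # true of atoms (non-pairs), including NIL
    return not isinstance(x, tuple)

def eq(a, b):            # identity test on atoms
    return atom(a) and atom(b) and a == b

# With these, plus Python's conditional standing in for COND,
# list functions fall out; e.g. a recursive append:
def append(xs, ys):
    return ys if xs is None else cons(car(xs), append(cdr(xs), ys))

ab = cons("a", cons("b", None))        # the list (a b)
print(append(ab, cons("c", None)))     # ('a', ('b', ('c', None)))
```

The point of the paragraph stands: this much fits on one page, and it is a
"LISP" in only the most theoretical sense -- everything that makes a
production system pleasant to use is still missing.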

The second problem is just the proliferation of LISP dialects. This can be
as deadly as the preceding problem.  I am generally opposed to
standardization efforts; I expect to see de facto standardization occur
out of usage, not by edict.  I expect some of our Newsletter efforts to
build toward such a consensus.

Now, let me try to answer some questions.

Books: 
Artificial Intelligence Programming by Charniak, Riesbeck, and McDermott
is a brand-new book, which covers much of LISP and AI quite well. Definitely
recommended. The first half is LISP programming with a bit about implementation.
There is an admirable passage in which one author (McDermott) describes how
he learned the pitfalls of programming at too low a level in LISP.
In case you haven't guessed by now, low-level LISP hacking is one of my
pet peeves.

Anatomy of LISP by me.
My book is not meant to be an introduction to LISP programming,
though I've been told it succeeds at that admirably.
It is aimed at a wider audience, teaching about computer science with LISP
(in fact a somewhat  purified LISP) as  its vehicle.  It discusses
abstract programming ideas in LISP, a technique which most LISP programmers
fail to apply effectively. It describes implementation techniques at this abstract
level (interpreters, compilers, ...) and moves down in layers from the abstract
to the concrete, ending with the usual bit diddling that gets taught as
data structures. Very little is said about actual LISP programming systems, simply
because most of them are afflicted with some defect. Either the dialect 
that got implemented is  diseased, or the interactive aspects got short-changed;
usually both. What is needed is a brand-new start; that was not  a fit subject for
a university text.

Neither of these books is effective as a self-teaching guide to LISP programming.
That is a void which I am filling soon; the documentation of TLC's LISP
will grow into a book over the next year. 

Randomness:
A couple asked about smart interpreters that can evaluate recursive functions 
in an iterative fashion. That means, basically, that the interpreter
won't stack arguments and return addresses when it makes a subcall, but
uses fixed storage for the arguments and "jumps" to the beginning of the
function. Patrick Greussay at the University of Paris has done the most
with this idea; it is an extension of a notion called "tail recursion".
The U. Paris LISP (VLISP) compiler also knows about tail recursion and will
often compile uncannily good code.
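The transformation such an interpreter performs can be sketched in a few
lines -- Python here, for concreteness, since Python itself does NOT
eliminate tail calls; the function names are illustrative, not taken from
VLISP:

```python
# Tail-recursive factorial: the recursive call is the LAST thing done,
# so nothing on the stack needs to survive it.
def fact_rec(n, acc=1):
    if n == 0:
        return acc
    return fact_rec(n - 1, acc * n)   # tail call

# What a tail-recursion-aware evaluator effectively executes:
# overwrite the argument slots in place and jump back to the top.
def fact_iter(n, acc=1):
    while True:
        if n == 0:
            return acc
        n, acc = n - 1, acc * n       # reuse fixed storage, "jump"

print(fact_rec(10))    # 3628800
print(fact_iter(10))   # 3628800
```

Both compute the same thing, but the second runs in constant stack space,
which is exactly why a smart interpreter need not stack the subcall.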

As for learning about applicative programming: the major problem is UNLEARNING
the crappy Fortran style most of us grew up with. John Backus, the father
of Fortran, has seen the light; he is now a strong advocate of
applicative languages. I have much more sympathy for Fortran than for,
say, Pascal. Fortran was a major breakthrough in programming languages,
done in a period when little was known about programming; its errors can
be attributed to a lack of knowledge. At the time, Fortran was an
incredible advance; today, however, Fortran is like a tenured senile
professor who continues to teach long after he should retire. Perhaps we
need an "emeritus language" position.
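To make the stylistic contrast concrete, here is one small computation
written both ways -- in Python rather than Fortran or LISP, purely for
runnability; the example is mine, not from Backus:

```python
from functools import reduce

# Fortran-ish imperative style: a sequence of statements
# mutating a variable, one step at a time.
def total_imperative(prices):
    total = 0
    for p in prices:
        total = total + p
    return total

# Applicative style: the whole program is one expression built
# by composing functions; no assignment statements, no mutation.
def total_applicative(prices):
    return reduce(lambda acc, p: acc + p, prices, 0)

print(total_imperative([1, 2, 3]))   # 6
print(total_applicative([1, 2, 3]))  # 6
```

The unlearning is in the shape, not the answer: the applicative version
is something you can reason about by substitution, which is the habit the
imperative style trains out of you.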

PLEASE, PLEASE, PLEASE!! Send your experiences with LISP, reviews of your
favorite LISP-ish books, reviews of implementations that you have access to,
and anything else that you feel the Newsletter reader might profit from.
This letter is supposed to reflect all your interests and concerns.

Don't give up. The next letter will be faster and fatter.
Hopefully it will contain some solid info on what micro-LISPs are
available and what their capabilities are.



john